
    A new heap game

    Given k ≄ 3 heaps of tokens, the moves of the two-player game introduced here are either to take a positive number of tokens from at most k − 1 heaps, or to remove the same positive number of tokens from all k heaps. We analyse this extension of Wythoff's game and provide a polynomial-time strategy for it.
    Comment: To appear in Computer Games 199
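
    The rules above can be illustrated by exhaustively computing P-positions (positions from which the player to move loses under normal play) for tiny heap sizes. This is our own brute-force sketch, not the paper's polynomial-time strategy; all function names are ours.

```python
from functools import lru_cache
from itertools import combinations, product

def moves(pos):
    """Yield all positions reachable in one move (heap sizes kept sorted)."""
    k = len(pos)
    # Move type 1: remove positive amounts from at most k-1 heaps.
    for r in range(1, k):
        for idxs in combinations(range(k), r):
            if any(pos[i] == 0 for i in idxs):
                continue
            for takes in product(*(range(1, pos[i] + 1) for i in idxs)):
                nxt = list(pos)
                for i, t in zip(idxs, takes):
                    nxt[i] -= t
                yield tuple(sorted(nxt))
    # Move type 2: remove the same positive amount from all k heaps.
    for t in range(1, min(pos) + 1):
        yield tuple(sorted(h - t for h in pos))

@lru_cache(maxsize=None)
def is_p_position(pos):
    """Under normal play, pos is a P-position (the player to move loses)
    iff every move leads to an N-position."""
    return all(not is_p_position(nxt) for nxt in moves(pos))
```

    For k = 3, for instance, (1, 1, 2) turns out to be a P-position: every available move leaves at least one zero heap alongside a nonzero heap, from which the opponent can empty the remaining heaps in a single type-1 move.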

    Fast Algorithm for Partial Covers in Words

    A factor u of a word w is a cover of w if every position in w lies within some occurrence of u in w. A word w covered by u thus generalizes the idea of a repetition, that is, a word composed of exact concatenations of u. In this article we introduce the new notion of an α-partial cover, which can be viewed as a relaxed variant of cover: a factor covering at least α positions in w. We develop a data structure of O(n) size (where n = |w|), constructible in O(n log n) time, which we apply to compute all shortest α-partial covers for a given α. We also employ it in an O(n log n)-time algorithm computing a shortest α-partial cover for each α = 1, 2, 
, n.
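
    The α-partial-cover condition itself is easy to check directly. The following naive O(|w| · |u|) scan is our own illustrative sketch of the definition, not the paper's O(n log n) data structure:

```python
def covered_positions(w, u):
    """Return the number of positions of w lying inside at least one
    occurrence of the factor u (naive O(|w| * |u|) scan)."""
    n, m = len(w), len(u)
    covered = [False] * n
    for i in range(n - m + 1):
        if w[i:i + m] == u:          # occurrence of u at position i
            for j in range(i, i + m):
                covered[j] = True
    return sum(covered)

def is_alpha_partial_cover(w, u, alpha):
    """u is an alpha-partial cover of w if it covers >= alpha positions."""
    return covered_positions(w, u) >= alpha
```

    For example, "aba" occurs in "abaababa" at positions 0, 3, and 5 and covers all 8 positions, so it is a (full) cover; "aa" covers only 4 of the 5 positions of "aabaa".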

    Game saturation of intersecting families

    We consider the following combinatorial game: two players, Fast and Slow, alternately claim k-element subsets of [n] = {1, 2, ..., n}, one at each turn, where each claimed set must intersect all previously claimed subsets. The game ends when there does not exist any unclaimed k-subset that meets all already claimed sets. The score of the game is the number of sets claimed by the two players; the aim of Fast is to keep the score as low as possible, while the aim of Slow is to postpone the game's end as long as possible. The game saturation number is the score of the game when both players play according to an optimal strategy. To be precise, we distinguish two cases depending on which player takes the first move: let gsat_F(I_{n,k}) and gsat_S(I_{n,k}) denote the score of the saturation game when both players play optimally and the game starts with Fast's or Slow's move, respectively. We prove that Ω_k(n^{k/3−5}) ≀ gsat_F(I_{n,k}), gsat_S(I_{n,k}) ≀ O_k(n^{k−√k/2}) holds.
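
    For very small parameters, the game saturation number can be computed directly by minimax search over claimed families. This exhaustive sketch (our own, exponential in the number of k-subsets, and in no way the paper's method) just illustrates the definition:

```python
from functools import lru_cache
from itertools import combinations

def game_saturation(n, k, first="Fast"):
    """Score of the saturation game on k-subsets of {1,...,n} under
    optimal play; Fast minimizes and Slow maximizes the final score."""
    universe = [frozenset(c) for c in combinations(range(1, n + 1), k)]

    @lru_cache(maxsize=None)
    def value(claimed, fast_to_move):
        # Legal moves: unclaimed k-subsets meeting every claimed set.
        legal = [s for s in universe
                 if s not in claimed and all(s & t for t in claimed)]
        if not legal:                      # saturated: score is |claimed|
            return len(claimed)
        scores = [value(claimed | {s}, not fast_to_move) for s in legal]
        return min(scores) if fast_to_move else max(scores)

    return value(frozenset(), first == "Fast")
```

    For n = 4, k = 2, every maximal intersecting family of 2-subsets of [4] is a triangle or a star, both of size 3, so the game ends with score 3 regardless of who moves first.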

    A Cauchy-Dirac delta function

    The Dirac delta function has solid roots in 19th century work in Fourier analysis and singular integrals by Cauchy and others, anticipating Dirac's discovery by over a century and illuminating the nature of Cauchy's infinitesimals and his infinitesimal definition of delta.
    Comment: 24 pages, 2 figures; Foundations of Science, 201

    Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond

    Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. In spite of his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions: not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
    Comment: 69 pages, 3 figures

    Measurement of L-shell emission from mid-Z targets under non-LTE conditions using Transmission Grating Spectrometer and DANTE power diagnostics

    In this work, we present the measurement of L-band emission from buried Sc/V targets in experiments performed at the OMEGA laser facility. The goal of these experiments was to study non-local thermodynamic equilibrium plasmas and benchmark atomic physics codes. The L-band emission was measured simultaneously by the time-resolved DANTE power diagnostic and the recently fielded time-integrated Soreq Transmission Grating Spectrometer (TGS) diagnostic. The TGS measurement was used to support the spectral reconstruction process needed for the unfolding of the DANTE data. The Soreq-TGS diagnostic allows for broadband spectral measurement in the 120 eV–2000 eV band, covering L- and M-shell emission of mid- and high-Z elements, with spectral resolution λ/Δλ = 8–30 and accuracy better than 25%. The Soreq-TGS diagnostic is compatible with ten-inch-manipulator platforms and can be used for a wide variety of high-energy-density physics, laboratory astrophysics, and inertial confinement fusion experiments.

    Ten Misconceptions from the History of Analysis and Their Debunking

    The widespread idea that infinitesimals were "eliminated" by the "great triumvirate" of Cantor, Dedekind, and Weierstrass is refuted by an uninterrupted chain of work on infinitesimal-enriched number systems. The elimination claim is an oversimplification created by triumvirate followers, who tend to view the history of analysis as a pre-ordained march toward the radiant future of Weierstrassian epsilontics. In the present text, we document distortions of the history of analysis stemming from the triumvirate ideology of ontological minimalism, which identified the continuum with a single number system. Such anachronistic distortions characterize the received interpretation of Stevin, Leibniz, d'Alembert, Cauchy, and others.
    Comment: 46 pages, 4 figures; Foundations of Science (2012). arXiv admin note: text overlap with arXiv:1108.2885 and arXiv:1110.545

    Huntington’s Disease iPSC-Derived Brain Microvascular Endothelial Cells Reveal WNT-Mediated Angiogenic and Blood-Brain Barrier Deficits

    Brain microvascular endothelial cells (BMECs) are an essential component of the blood-brain barrier (BBB) that shields the brain against toxins and immune cells. While BBB dysfunction exists in neurological disorders, including Huntington's disease (HD), it is not known whether BMECs themselves are functionally compromised to promote BBB dysfunction. Further, the underlying mechanisms of BBB dysfunction remain elusive, given the limitations of mouse models and post-mortem tissue for identifying primary deficits. We undertook a transcriptome and functional analysis of human induced pluripotent stem cell (iPSC)-derived BMECs (iBMEC) from HD patients or unaffected controls. We demonstrate that HD iBMECs have intrinsic abnormalities in angiogenesis and barrier properties, as well as in signaling pathways governing these processes. Thus, our findings provide an iPSC-derived BBB model for a neurodegenerative disease and demonstrate autonomous neurovascular deficits that may underlie HD pathology, with implications for therapeutics and drug delivery.
    Funding: American Heart Association (12PRE10410000); American Heart Association (CIRMTG2-01152); National Institutes of Health (U.S.) (NIHNS089076)
    • 

    corecore